A hidden Markov model (HMM) extends the Markov model by adding a hidden state. This can be used to model natural phenomena, such as weather systems that have some longer-term meteorological state as well as the immediately measurable rain or shine. Formally, given a time series or sequence of observable events/tokens, a simple Markov model has a matrix of probabilities Mjk giving the probability that event j comes after event k; the hidden Markov model instead has a more complex matrix Hjs;kt, which gives the probability that, given an observable event j and (unobservable) hidden state s, the next event will be k and the next hidden state t.
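A minimal sketch of this joint formulation, sampling observable events from a toy two-event, two-state model; the array H, the weather-style labels, and the helper sample_sequence are all hypothetical illustrations, not part of the original text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: 2 observable events (0 = "rain", 1 = "shine")
# and 2 hidden states (0 = "wet season", 1 = "dry season").
n_events, n_states = 2, 2

# H[j, s, k, t] = P(next event = k, next state = t | event = j, state = s).
# For each (j, s) pair the entries over (k, t) must sum to 1.
H = rng.random((n_events, n_states, n_events, n_states))
H /= H.sum(axis=(2, 3), keepdims=True)

def sample_sequence(H, j0, s0, length):
    """Sample observable events; the hidden state evolves internally."""
    n_events, n_states = H.shape[:2]
    j, s = j0, s0
    events = [j]
    for _ in range(length - 1):
        flat = H[j, s].ravel()                 # joint distribution over (k, t)
        idx = rng.choice(flat.size, p=flat)    # draw one (k, t) pair
        j, s = divmod(idx, n_states)           # row-major unflatten
        events.append(j)
    return events

seq = sample_sequence(H, j0=0, s0=0, length=10)
```

Only the event sequence is returned; an outside observer never sees the hidden-state trajectory, which is exactly what makes the model "hidden".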
Inferring an HMM from training sequences is more difficult than training a plain Markov model because the hidden state is not available in the training data and so must itself be fitted as part of the training process.
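The core difficulty is that any fitting procedure has to marginalize over every possible hidden-state path. A hedged sketch of that marginalization under the joint formulation above, using the standard forward-algorithm recursion (sequence_likelihood is a hypothetical helper name):

```python
import numpy as np

def sequence_likelihood(H, events, init_state_dist):
    """Forward algorithm for the joint formulation:
    H[j, s, k, t] = P(next event = k, next state = t | event = j, state = s).
    Returns P(events[1:] | events[0]), summing over all hidden-state paths."""
    alpha = np.asarray(init_state_dist, dtype=float)  # P(state at first event)
    for j, k in zip(events, events[1:]):
        # H[j, :, k, :] is an (n_states x n_states) slice: rows = current
        # state s, columns = next state t; the matrix product sums over s.
        alpha = alpha @ H[j, :, k, :]
    return alpha.sum()

# Toy check on a random (normalized) 2-event, 2-state model.
rng = np.random.default_rng(1)
H = rng.random((2, 2, 2, 2))
H /= H.sum(axis=(2, 3), keepdims=True)
p = sequence_likelihood(H, [0, 1, 1, 0], [0.5, 0.5])
```

This likelihood is what an expectation-maximization style training loop would repeatedly evaluate; with the hidden states observed, as in a plain Markov model, training would reduce to simple counting.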
Note that higher-order Markov models (those that take into account several past observable events), which are easier to learn, can always be represented as HMMs by simply taking the hidden state to be the past tokens. This can be one way to bootstrap the HMM training process.
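This construction can be sketched directly in the joint formulation above: for a second-order model, take the hidden state to be the previous token, so the next hidden state is forced to equal the current event. The function name and the random toy model are illustrative assumptions:

```python
import numpy as np

def second_order_to_hmm(M2):
    """Represent a second-order Markov model as an HMM whose hidden
    state is the previous token.
    M2[i, j, k] = P(next event = k | the last two events were i then j).
    Returns H with H[j, s, k, t] = P(next event = k, next state = t |
    event = j, state = s); the next hidden state t is always j."""
    n = M2.shape[0]
    H = np.zeros((n, n, n, n))
    for j in range(n):
        for s in range(n):
            H[j, s, :, j] = M2[s, j, :]   # t = j with probability 1
    return H

# Hypothetical 3-token second-order model with normalized rows.
rng = np.random.default_rng(2)
M2 = rng.random((3, 3, 3))
M2 /= M2.sum(axis=2, keepdims=True)
H = second_order_to_hmm(M2)
```

Because the hidden-state transition here is deterministic, such an H makes a convenient, already-consistent starting point from which HMM training can then relax the hidden state into something more compact than the raw history.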
Used in Chap. 13: page 197; Chap. 14: pages 210, 211, 213, 218, 219; Chap. 17: page 262
Also known as HMM